Newton-Stein Method: An Optimization Method for GLMs via Stein's Lemma
Author
Abstract
We consider the problem of efficiently computing the maximum likelihood estimator in Generalized Linear Models (GLMs) when the number of observations is much larger than the number of coefficients (n ≫ p ≫ 1). In this regime, optimization algorithms can immensely benefit from approximate second-order information. We propose an alternative way of constructing the curvature information by formulating it as an estimation problem and applying a Stein-type lemma, which allows further improvements through sub-sampling and eigenvalue thresholding. Our algorithm enjoys fast convergence rates, resembling those of second-order methods, with a modest per-iteration cost. We provide a convergence analysis for the general case where the rows of the design matrix are samples from a sub-Gaussian distribution. We show that the convergence has two phases: a quadratic phase followed by a linear phase. Finally, we empirically demonstrate that our algorithm outperforms various other optimization algorithms on several data sets.
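To make the construction concrete, here is a minimal numpy sketch of one such update for logistic regression, reconstructed from the abstract's description alone: a Stein-type identity for Gaussian designs reduces the Hessian estimate to two scalar moments plus a rank-one term, which can be inverted cheaply via Sherman-Morrison. The function name `newton_stein_step` and all implementation details are our assumptions, not the authors' reference implementation, and the sub-sampling and eigenvalue-thresholding refinements are omitted.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def newton_stein_step(X, y, beta, Sigma_hat, step=1.0):
    """One Newton-Stein-style update for logistic regression (sketch).

    For a Gaussian design x ~ N(0, Sigma), a Stein-type identity gives
        E[psi''(<x, b>) x x^T] = mu2 * Sigma + mu4 * (Sigma b)(Sigma b)^T,
    so the curvature estimate needs only two scalar moments per iteration.
    Assumes mu2 * Sigma_hat is invertible.
    """
    n = X.shape[0]
    s = sigmoid(X @ beta)
    # gradient of the negative log-likelihood, psi(t) = log(1 + e^t)
    grad = X.T @ (s - y) / n
    # scalar curvature moments: psi'' and psi'''' averaged over the sample
    psi2 = s * (1.0 - s)
    psi4 = psi2 * (1.0 - 6.0 * s + 6.0 * s ** 2)
    mu2, mu4 = psi2.mean(), psi4.mean()
    # rank-one correction direction v = Sigma_hat @ beta
    v = Sigma_hat @ beta
    # apply (mu2 * Sigma + mu4 v v^T)^{-1} to grad via Sherman-Morrison
    A_inv_g = np.linalg.solve(mu2 * Sigma_hat, grad)
    A_inv_v = np.linalg.solve(mu2 * Sigma_hat, v)
    denom = 1.0 + mu4 * (v @ A_inv_v)
    direction = A_inv_g - (mu4 * (v @ A_inv_g) / denom) * A_inv_v
    return beta - step * direction
```

On synthetic Gaussian logistic data the iteration behaves like a damped Newton step near the origin (where the true Hessian is exactly 0.25 Σ), while only two scalar moments are re-estimated per iteration instead of a full p × p Hessian.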
Similar resources
Newton-Stein Method: A Second Order Method for GLMs via Stein's Lemma
We consider the problem of efficiently computing the maximum likelihood estimator in Generalized Linear Models (GLMs) when the number of observations is much larger than the number of coefficients (n ≫ p ≫ 1). In this regime, optimization algorithms can immensely benefit from approximate second order information. We propose an alternative way of constructing the curvature information by formulating...
An Efficient Improvement of the Newton Method for Solving Nonconvex Optimization Problems
Newton's method is one of the most famous line search methods for minimizing functions. It is well known that the search direction and step length play important roles in this class of methods for solving optimization problems. In this investigation, a new modification of Newton's method for solving unconstrained optimization problems is presented. The significant ...
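As a point of reference for the class of methods this snippet modifies, a plain damped Newton iteration with Armijo backtracking can be sketched as follows. This is a generic textbook baseline, not the paper's proposed modification; the steepest-descent fallback for non-descent directions is our own safeguard.

```python
import numpy as np

def damped_newton(f, grad, hess, x0, tol=1e-8, max_iter=100):
    """Newton's method with Armijo backtracking line search.

    A generic sketch of the baseline: solve H d = g for the Newton
    direction, then shrink the step until a sufficient-decrease
    condition holds. Falls back to steepest descent when the Newton
    direction is not a descent direction.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess(x), g)  # step taken is -d
        if g @ d <= 0:                   # Hessian not positive definite here:
            d = g                        # fall back to steepest descent
        t = 1.0
        # Armijo condition: f(x - t d) <= f(x) - c * t * g.d, c = 1e-4
        while f(x - t * d) > f(x) - 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x = x - t * d
    return x
```

On the classic Rosenbrock function the iteration converges to the minimizer (1, 1) from the standard starting point (-1.2, 1), illustrating the role the search direction and step length play in this class of methods.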
A Stein's Method Proof of the Asymptotic Normality of Descents and Inversions in the Symmetric Group
Let W(π) be either the number of descents or inversions of a permutation π ∈ Sn. Stein's method is applied to bound the sup-norm distance between the distribution of W and the standard normal distribution. This appears to be the first use of Stein's method in the theory of permutation statistics. The construction of an exchangeable pair (W, W′) used in Stein's method is non-trivial and may be of indep...
Spin Glasses and Stein's Method (Aug 2009)
We introduce some applications of Stein's method in the high temperature analysis of spin glasses. Stein's method allows the direct analysis of the Gibbs measure without having to create a cavity. Another advantage is that it gives limit theorems with total variation error bounds, although the bounds can be suboptimal. A surprising byproduct of our analysis is a relatively transparent explanati...
A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems
In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a two-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
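The two-parameter scaled formula itself is not reproduced in this excerpt. As a point of comparison, the classical analogue is the BFGS update, which likewise yields a Hessian approximation satisfying the standard secant relation and preserving positive definiteness whenever the curvature condition s·y > 0 holds; a minimal sketch:

```python
import numpy as np

def bfgs_update(B, s, y):
    """Standard BFGS update of a Hessian approximation B.

    Given a step s = x_new - x_old and gradient change
    y = grad(x_new) - grad(x_old), the updated matrix satisfies the
    secant relation B_new @ s = y, and remains positive definite
    whenever B is positive definite and s @ y > 0. This is the
    classical single-formula analogue, not the paper's two-parameter
    scaled variant.
    """
    Bs = B @ s
    return (B
            - np.outer(Bs, Bs) / (s @ Bs)   # remove old curvature along s
            + np.outer(y, y) / (s @ y))     # enforce the secant relation
```

The two rank-one terms cancel B's action along s and replace it so that B_new s = y exactly, which is easy to verify numerically.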
Journal:
- Journal of Machine Learning Research
Volume 17, Issue -
Pages -
Publication year: 2016